from IPython.display import Image
Image('/home/nikolay/Documents/Medium/Attention/attention.png', width = 1000)
Analysis of text data is of great interest for Computational Biology because biological sequences (DNA, RNA, proteins, etc.) can be viewed as texts.
from IPython.display import Image
Image('/home/nikolay/Documents/Medium/Attention/texts.png', width = 1000)
In the DNA sequence below, the blue segments are gene regions and the red segments are intergenic regions.
from IPython.display import Image
Image('/home/nikolay/Documents/Medium/Attention/dna.png', width = 1000)
A good old way of detecting gene regions is the Hidden Markov Model (HMM), which is based on the concepts of transition and emission probabilities. Briefly, using annotated sets of genic and intergenic sequences and counting how many times the A, C, G and T nucleotides are observed in each set, one can compute the emission matrix. Then, one can define a transition matrix via the probabilities of switching between the genic and intergenic states.
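To make the counting step concrete, here is a minimal sketch of estimating emission probabilities from annotated sequences. The two toy sequence sets below are invented for illustration and are far shorter than real annotated data.

```python
import pandas as pd

# Toy annotated sequence sets (invented for illustration)
gene_seqs = ["ATGACGT", "ATTACA"]
intergenic_seqs = ["AATTGC", "TTAACG"]

def emission_probs(seqs):
    # Count each nucleotide across all sequences and normalize to frequencies
    counts = {n: sum(s.count(n) for s in seqs) for n in "ACGT"}
    total = sum(counts.values())
    return {n: counts[n] / total for n in "ACGT"}

emission_toy = pd.DataFrame([emission_probs(gene_seqs), emission_probs(intergenic_seqs)],
                            index = ["Gene", "NotGene"])
print(emission_toy)
```

With real annotated data, the same counting procedure yields the emission matrix used below.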
import numpy as np
import pandas as pd
emission = pd.DataFrame(np.array([[0.2910406, 0.2084386, 0.2086294, 0.2918914],
                                  [0.3089565, 0.1911851, 0.1911704, 0.3086880]]),
                        columns = ["A", "C", "G", "T"], index = ["Gene", "NotGene"])
emission
| | A | C | G | T |
|---|---|---|---|---|
| Gene | 0.291041 | 0.208439 | 0.208629 | 0.291891 |
| NotGene | 0.308957 | 0.191185 | 0.191170 | 0.308688 |
transition = pd.DataFrame(np.array([[0.59, 0.41], [0.40, 0.60]]),
                          columns = ["Gene", "NotGene"], index = ["Gene", "NotGene"])
transition
| | Gene | NotGene |
|---|---|---|
| Gene | 0.59 | 0.41 |
| NotGene | 0.40 | 0.60 |
Finally, the Viterbi algorithm, which is the main HMM decoding algorithm, combines the emission and transition probabilities in order to infer whether each position of a sequence belongs to a genic or an intergenic region.
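As a minimal sketch of Viterbi decoding (in log space to avoid numerical underflow), using the emission and transition values from the matrices above; the uniform initial state probabilities are an assumption for illustration:

```python
import numpy as np

states = ["Gene", "NotGene"]
nuc_index = {"A": 0, "C": 1, "G": 2, "T": 3}
emission_np = np.array([[0.2910406, 0.2084386, 0.2086294, 0.2918914],
                        [0.3089565, 0.1911851, 0.1911704, 0.3086880]])
transition_np = np.array([[0.59, 0.41],
                          [0.40, 0.60]])
initial = np.array([0.5, 0.5])  # assumed uniform start probabilities

def viterbi(seq):
    obs = [nuc_index[n] for n in seq]
    # Log-probability of starting in each hidden state
    V = np.log(initial) + np.log(emission_np[:, obs[0]])
    backpointers = []
    for o in obs[1:]:
        scores = V[:, None] + np.log(transition_np)   # scores[i, j]: move from state i to state j
        backpointers.append(scores.argmax(axis = 0))  # best predecessor for each state
        V = scores.max(axis = 0) + np.log(emission_np[:, o])
    # Trace the most probable state path backwards
    path = [int(V.argmax())]
    for ptr in reversed(backpointers):
        path.append(int(ptr[path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi("ACGT"))
```

The function returns one Gene / NotGene label per nucleotide, i.e. the most probable hidden state path.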
from IPython.display import Image
Image('/home/nikolay/Documents/Medium/Attention/HMM_gene_prediction.png', width = 1000)
Similarly, one can use the concept of a Markov Chain and its transition matrix for generating a DNA sequence. Below we compute the Markov Chain transition matrix, which holds the probabilities of observing each of the four nucleotides, i.e. A, C, G and T, after each of the four nucleotides. Please note that the interpretation of the Markov Chain transition matrix is slightly different from the HMM transition matrix above: here both rows and columns correspond to observed nucleotides, whereas the HMM transition matrix describes switching between the hidden genic / intergenic states.
markov_chain_transition = pd.DataFrame(np.array([[0.30, 0.20, 0.30, 0.20],
                                                 [0.25, 0.25, 0.25, 0.25],
                                                 [0.25, 0.25, 0.25, 0.25],
                                                 [0.20, 0.20, 0.30, 0.30]]),
                                       columns = ["A", "C", "G", "T"], index = ["A", "C", "G", "T"])
markov_chain_transition
| | A | C | G | T |
|---|---|---|---|---|
| A | 0.30 | 0.20 | 0.30 | 0.20 |
| C | 0.25 | 0.25 | 0.25 | 0.25 |
| G | 0.25 | 0.25 | 0.25 | 0.25 |
| T | 0.20 | 0.20 | 0.30 | 0.30 |
Now, given an initial nucleotide, we can draw the next nucleotides from a multinomial distribution over the four classes A, C, G and T, and thus generate a nucleotide sequence of a certain length as a Markov Chain.
current_nuc = 'C'
nucs = ["A", "C", "G", "T"]
next_nuc_list = []
for i in range(100):
    next_nuc = np.array(nucs)[np.random.multinomial(1, list(markov_chain_transition.loc[current_nuc, :])) == 1][0]
    current_nuc = next_nuc
    next_nuc_list.append(next_nuc)
''.join(next_nuc_list)
'TGTGCATTGTGCCTCAGGGCATGACTGACACCCAGTGCCGGGGAGTGATTTTATAGAGCTCATTTAGCGTGAGTCGCCTTCCTATATGAAACTCAGCTGA'
The obtained nucleotide sequence should have nucleotide and dinucleotide frequencies similar to the real human DNA.
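This claim is easy to check: the small sketch below computes the mono- and dinucleotide frequencies of (an excerpt of) the generated chain, which could then be compared against frequencies estimated from real DNA.

```python
from collections import Counter

def nucleotide_freqs(seq):
    # Fraction of each nucleotide in the sequence
    counts = Counter(seq)
    return {n: counts[n] / len(seq) for n in "ACGT"}

def dinucleotide_freqs(seq):
    # Fraction of each overlapping two-letter window
    pairs = Counter(seq[i:i + 2] for i in range(len(seq) - 1))
    total = sum(pairs.values())
    return {p: c / total for p, c in pairs.items()}

seq = "TGTGCATTGTGCCTCAGGGCATGACTGACACC"  # excerpt of the chain generated above
print(nucleotide_freqs(seq))
print(dinucleotide_freqs(seq))
```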
Finally, before going further, let us ask: if a DNA sequence is a text, what would then be the words? It turns out that k-mers can be considered as words, i.e. building blocks of the DNA sentence / sequence. Indeed, sliding a window of a fixed size (the k-mer size) along a DNA sequence of length L, we can represent it as L - k + 1 words, which for convenience we separate with spaces, see the figure below. Then word / k-mer frequencies can be used for a sentiment-like analysis, i.e. a comparison of word / k-mer usage between case and control DNA, as in the Bag of Words NLP method.
from IPython.display import Image
Image('/home/nikolay/Documents/Medium/Attention/kmers.png', width = 1000)
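The sliding-window k-merization illustrated above takes only a couple of lines; a minimal sketch:

```python
def kmerize(seq, k = 3):
    # Slide a window of size k along the sequence: L - k + 1 words
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

print(" ".join(kmerize("ATGCGTA", k = 3)))  # ATG TGC GCG CGT GTA
```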
Also, the words / k-mers can be represented as numeric vectors of arbitrary length (so-called embeddings, which can be obtained from a trained Embedding layer of a neural network), as implemented in the pre-trained Word2Vec NLP model, which can also preserve some semantic relations between the words.
from IPython.display import Image
Image('/home/nikolay/Documents/Medium/Attention/tSNE_kmers.png', width = 1000)
In the tSNE plot above, the words / k-mers are visualized in 2D space, where it is clear that AT-rich and GC-rich k-mers form two distinct clusters. Please keep in mind this possibility of encoding words / k-mers as embedding vectors; we will use it later when explaining the concept of Attention.
Now, after we have explained the relevance of textual data analysis for Computational Biology, and demonstrated how a DNA sequence can be generated by a Markov process given a transition matrix, let us exercise a bit with real texts. In this section we will demonstrate how to generate text with a simple short-memory process, a Markov Chain, which remembers only the previous word in a text. For demonstration purposes, we will be using the text from the first part of the novel "Crime and Punishment" by Fyodor Dostoevsky, which can be copied from Project Gutenberg.
from IPython.display import Image
Image('/home/nikolay/Documents/Medium/Attention/dostoevsky.png', width = 1000)
We could use the whole text, but to properly visualize the transition matrix it is helpful to start with just one (not even complete) sentence.
import numpy as np
import pandas as pd
sentence = 'The landlady who provided him with garret, dinners, and attendance, lived on the floor \
below, and every time he went out he was'
sentence
#sentence = 'The landlady who provided him with garret, dinners, and attendance, lived on the floor \
#below, and every time he went out he was obliged to pass her kitchen, the door of which invariably stood open'
#sentence = open('crime_and_punishment.txt', encoding = 'utf8').read()
#sentence[0:1000]
'The landlady who provided him with garret, dinners, and attendance, lived on the floor below, and every time he went out he was'
tokens = sentence.split()
tokens[0:8]
['The', 'landlady', 'who', 'provided', 'him', 'with', 'garret,', 'dinners,']
After we have tokenized the sentence (please note that, since we are aiming at text generation, in contrast to a typical NLP pre-processing pipeline we are not going to remove punctuation or stop words, nor apply lemmatization), we can create a transition matrix by counting, for each word, how many times one word follows another. Then, after per-word normalization by the total number of observed word pairs, we get the transition matrix below, where the elements can be interpreted as the probabilities of observing two words in a certain order.
transition_matrix = np.zeros((len(tokens), len(tokens)))
for i in range(len(tokens) - 1):
    transition_matrix[i, i + 1] += 1
transition_df = (pd.DataFrame(transition_matrix, index = tokens, columns = tokens)).T
transition_df = transition_df.groupby(transition_df.columns, axis = 1).sum()
transition_df = transition_df.div(transition_df.sum(), axis = 1)
transition_df = transition_df.fillna(0)
transition_df
| | The | and | attendance, | below, | dinners, | every | floor | garret, | he | him | ... | lived | on | out | provided | the | time | was | went | who | with |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| The | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| landlady | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| who | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| provided | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 |
| him | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| with | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| garret, | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
| dinners, | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| and | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| attendance, | 0.0 | 0.5 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| lived | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| on | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| the | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| floor | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| below, | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| and | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| every | 0.0 | 0.5 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| time | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| he | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| went | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| out | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| he | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | ... | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| was | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5 | 0.0 | ... | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
23 rows × 21 columns
transition_df.shape
(23, 21)
Please note that all words but "and" and "he" have only one following word. In contrast, the word "and" can be followed by two words, "attendance," and "every", while "he" can be followed by "went" and "was", each with a 50% chance. Now, given the transition matrix, we can specify an initial word (for example, "provided") and randomly draw the next words from the multinomial 21-class distribution, similarly to the way we previously generated the nucleotide sequence.
N_words = 10
current_w = 'provided'
markov_chain = [current_w]
for i in range(N_words):
    next_w = np.array(tokens)[np.random.multinomial(1, list(transition_df.loc[:, current_w])) == 1][0]
    markov_chain.append(next_w)
    current_w = next_w
' '.join(markov_chain)
'provided him with garret, dinners, and every time he went out'
We get an almost perfect text, especially in the beginning. However, please note how the meaning slightly changed starting from the word "and". More specifically, instead of "... with garret, dinners, and attendance ...", the sentence flipped to "... with garret, dinners, and every time ...". This is because the Markov Chain had a 50% chance of selecting either "attendance," (a perfect match) or "every" (sub-optimal), and it selected "every". However, providing more text would likely improve the performance, as it would become more and more obvious which word (i.e. "attendance," or "every") is more typical in this context: the counts in the transition matrix grow with the amount of text.
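The accumulation of counts can be sketched with a simple bigram counter; the snippet of repeated text below is invented to show how additional occurrences tilt the distribution toward the typical successor of "and":

```python
from collections import Counter, defaultdict

def bigram_counts(text):
    # For each word, count how often each successor follows it
    counts = defaultdict(Counter)
    tokens = text.split()
    for w1, w2 in zip(tokens, tokens[1:]):
        counts[w1][w2] += 1
    return counts

text = "and attendance lived on the floor and attendance and every time"
counts = bigram_counts(text)
print(counts["and"].most_common())  # [('attendance', 2), ('every', 1)]
```

Normalizing each counter gives exactly the per-word transition probabilities used above, and with more text the most frequent successor dominates.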
In this section, we will demonstrate how to generate text with an LSTM neural network, which presumably has a longer memory and can generate more meaningful texts. The design of the LSTM was borrowed from materials available here and here. Again, for simplicity and a more accurate comparison with the Markov Chain text generator, we will use not the whole text of part 1 of "Crime and Punishment" (we could if we wanted!) but only the first three paragraphs, containing just a few sentences.
import numpy as np
data = open('crime_and_punishment.txt').read()
corpus = data.lower().split("\n")
corpus = corpus[0:5]
corpus
['on an exceptionally hot evening early in july a young man came out of the garret in which he lodged in s. place and walked slowly, as though in hesitation, towards k. bridge.', '', 'he had successfully avoided meeting his landlady on the staircase. his garret was under the roof of a high, five-storied house and was more like a cupboard than a room. the landlady who provided him with garret, dinners, and attendance, lived on the floor below, and every time he went out he was obliged to pass her kitchen, the door of which invariably stood open. and each time he passed, the young man had a sick, frightened feeling, which made him scowl and feel ashamed. he was hopelessly in debt to his landlady, and was afraid of meeting her.', '', 'this was not because he was cowardly and abject, quite the contrary; but for some time past he had been in an overstrained irritable condition, verging on hypochondria. he had become so completely absorbed in himself, and isolated from his fellows that he dreaded meeting, not only his landlady, but anyone at all. he was crushed by poverty, but the anxieties of his position had of late ceased to weigh upon him. he had given up attending to matters of practical importance; he had lost all desire to do so. nothing that any landlady could do had a real terror for him. but to be stopped on the stairs, to be forced to listen to her trivial, irrelevant gossip, to pestering demands for payment, threats and complaints, and to rack his brains for excuses, to prevaricate, to lie—no, rather than that, he would creep down the stairs like a cat and slip out unseen.']
Please note that we are not going to remove punctuation or stop words, nor apply stemming and lemmatization, i.e. we are not following the standard NLP text pre-processing steps. This is because our task is text generation, which does not require extensive cleaning of the input text.
Text generation requires sequence input data: given a sequence of words / tokens, the aim is to predict the next word / token.
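For instance, a five-word phrase yields four (context, next word) training pairs; a minimal sketch:

```python
def next_word_pairs(tokens):
    # Every prefix of length i serves as context for predicting token i
    return [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

for context, target in next_word_pairs("he went out he was".split()):
    print(context, "->", target)
```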
Next, we need to tokenize our corpus. Tokenization is the process of extracting tokens (terms / words) from a corpus and encoding them as numbers. One can either tokenize the corpus manually or use the special tools available in the NLTK and Keras libraries. Keras has a built-in Tokenizer class which can be used to obtain the tokens and their indices in the corpus.
from tensorflow.keras.preprocessing.text import Tokenizer
tokenizer = Tokenizer()
tokenizer.fit_on_texts(corpus)
total_words = len(tokenizer.word_index) + 1
total_words
162
We can see that we have more than 150 words in the first three paragraphs of "Crime and Punishment". Now, once the Tokenizer has been applied to the corpus, it can provide multiple outputs such as word counts, a co-occurrence matrix, etc. Here, however, we are interested in one particular form of the text, namely a sequence of tokens, i.e. the corpus will be represented as a list of lists (one list per sentence) where each word is converted into an integer.
input_sequences = []
for line in corpus:
    token_list = tokenizer.texts_to_sequences([line])[0]
    for i in range(1, len(token_list)):
        n_gram_sequence = token_list[:i+1]
        input_sequences.append(n_gram_sequence)
input_sequences[0:10]
[[11, 23], [11, 23, 34], [11, 23, 34, 35], [11, 23, 34, 35, 36], [11, 23, 34, 35, 36, 37], [11, 23, 34, 35, 36, 37, 7], [11, 23, 34, 35, 36, 37, 7, 38], [11, 23, 34, 35, 36, 37, 7, 38, 8], [11, 23, 34, 35, 36, 37, 7, 38, 8, 24], [11, 23, 34, 35, 36, 37, 7, 38, 8, 24, 25]]
In the above output, [11, 23], [11, 23, 34], [11, 23, 34, 35] and so on represent the N-gram phrases generated from the input text data, where every integer corresponds to the index of a particular word in the complete vocabulary of words present in the text. We need to prepare the data in this way, as N-grams, for the particular task of text generation. Later, we will use the last word as a label, and all the words prior to the last word as data. Therefore, the task will be to use all the words preceding the last word in order to predict the last word in a sentence.
After this step, every text document in the dataset is converted into a sequence of tokens. Since different sequences have different lengths, before starting to train the model we need to pad the sequences to make their lengths equal. We can use the pad_sequences function of Keras for this purpose.
from tensorflow.keras.preprocessing.sequence import pad_sequences
max_sequence_len = max([len(x) for x in input_sequences])
input_sequences = np.array(pad_sequences(input_sequences, maxlen = max_sequence_len, padding = 'pre'))
input_sequences
array([[ 0, 0, 0, ..., 0, 11, 23],
[ 0, 0, 0, ..., 11, 23, 34],
[ 0, 0, 0, ..., 23, 34, 35],
...,
[ 0, 0, 92, ..., 159, 3, 160],
[ 0, 92, 6, ..., 3, 160, 16],
[ 92, 6, 28, ..., 160, 16, 161]], dtype=int32)
To feed these data into a learning model, we need to create predictors and labels: the N-gram sequences will serve as predictors, and the next word of each N-gram as the label.
import tensorflow.keras.utils as ku
predictors, label = input_sequences[:,:-1],input_sequences[:,-1]
label = ku.to_categorical(label, num_classes = total_words)
label
array([[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
...,
[0., 0., 0., ..., 0., 1., 0.],
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 1.]], dtype=float32)
Let us build a Sequential Keras model with a word Embedding layer as the first layer. Then we apply a bidirectional LSTM layer with the parameter return_sequences set to True, so that word generation takes into account both the previous words and the words coming ahead in the sequence. A Dropout layer is added to avoid overfitting, followed by one more LSTM layer and a Dense layer with ReLU activation and an L2 kernel regularizer, again to counter overfitting. The output layer uses softmax so as to obtain the probability of each candidate next word.
from tensorflow.keras import regularizers
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense, Dropout, Bidirectional
model = Sequential()
model.add(Embedding(total_words, 100, input_length=max_sequence_len-1))
model.add(Bidirectional(LSTM(150, return_sequences = True)))
model.add(Dropout(0.2))
model.add(LSTM(100))
model.add(Dense(total_words // 2, activation='relu', kernel_regularizer=regularizers.l2(0.01)))
model.add(Dense(total_words, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
embedding (Embedding)        (None, 153, 100)          16200
_________________________________________________________________
bidirectional (Bidirectional (None, 153, 300)          301200
_________________________________________________________________
dropout (Dropout)            (None, 153, 300)          0
_________________________________________________________________
lstm_1 (LSTM)                (None, 100)               160400
_________________________________________________________________
dense (Dense)                (None, 81)                8181
_________________________________________________________________
dense_1 (Dense)              (None, 162)               13284
=================================================================
Total params: 499,265
Trainable params: 499,265
Non-trainable params: 0
_________________________________________________________________
history = model.fit(predictors, label, epochs = 200, verbose = 1)
Epoch 1/200
9/9 [==============================] - 7s 370ms/step - loss: 5.9460 - accuracy: 0.0173
Epoch 2/200
9/9 [==============================] - 3s 361ms/step - loss: 5.7952 - accuracy: 0.0780
Epoch 3/200
9/9 [==============================] - 3s 369ms/step - loss: 5.5376 - accuracy: 0.0391
...
Epoch 158/200
9/9 [==============================] - 4s 468ms/step - loss: 0.7383 - accuracy: 0.9776
Epoch 159/200
9/9 [==============================] - 4s 429ms/step - loss: 0.7139 - accuracy: 0.9721
[==============================] - 4s 461ms/step - loss: 0.7736 - accuracy: 0.9553 Epoch 161/200 9/9 [==============================] - 4s 423ms/step - loss: 0.7797 - accuracy: 0.9534 Epoch 162/200 9/9 [==============================] - 4s 463ms/step - loss: 0.7694 - accuracy: 0.9757 Epoch 163/200 9/9 [==============================] - 4s 421ms/step - loss: 0.7077 - accuracy: 0.9784 Epoch 164/200 9/9 [==============================] - 4s 459ms/step - loss: 0.7215 - accuracy: 0.9819 Epoch 165/200 9/9 [==============================] - 4s 419ms/step - loss: 0.7166 - accuracy: 0.9908 Epoch 166/200 9/9 [==============================] - 4s 462ms/step - loss: 0.7190 - accuracy: 0.9757 Epoch 167/200 9/9 [==============================] - 4s 418ms/step - loss: 0.6875 - accuracy: 0.9924 Epoch 168/200 9/9 [==============================] - 4s 455ms/step - loss: 0.6888 - accuracy: 0.9816 Epoch 169/200 9/9 [==============================] - 4s 420ms/step - loss: 0.6528 - accuracy: 0.9859 Epoch 170/200 9/9 [==============================] - 4s 450ms/step - loss: 0.6544 - accuracy: 0.9920 Epoch 171/200 9/9 [==============================] - 4s 421ms/step - loss: 0.6369 - accuracy: 0.9917 Epoch 172/200 9/9 [==============================] - 4s 451ms/step - loss: 0.6476 - accuracy: 0.9856 Epoch 173/200 9/9 [==============================] - 4s 466ms/step - loss: 0.6523 - accuracy: 0.9953 Epoch 174/200 9/9 [==============================] - 4s 438ms/step - loss: 0.6200 - accuracy: 0.9923 Epoch 175/200 9/9 [==============================] - 4s 437ms/step - loss: 0.6331 - accuracy: 0.9903 Epoch 176/200 9/9 [==============================] - 4s 454ms/step - loss: 0.6258 - accuracy: 0.9867 Epoch 177/200 9/9 [==============================] - 4s 455ms/step - loss: 0.6106 - accuracy: 0.9826 Epoch 178/200 9/9 [==============================] - 4s 421ms/step - loss: 0.6052 - accuracy: 0.9978 Epoch 179/200 9/9 [==============================] - 4s 455ms/step - loss: 0.6024 - accuracy: 
0.9930 Epoch 180/200 9/9 [==============================] - 4s 450ms/step - loss: 0.5871 - accuracy: 0.9893 Epoch 181/200 9/9 [==============================] - 4s 421ms/step - loss: 0.6229 - accuracy: 0.9816 Epoch 182/200 9/9 [==============================] - 4s 459ms/step - loss: 0.6051 - accuracy: 0.9939 Epoch 183/200 9/9 [==============================] - 4s 422ms/step - loss: 0.6083 - accuracy: 0.9973 Epoch 184/200 9/9 [==============================] - 4s 449ms/step - loss: 0.5924 - accuracy: 0.9897 Epoch 185/200 9/9 [==============================] - 4s 430ms/step - loss: 0.5686 - accuracy: 0.9944 Epoch 186/200 9/9 [==============================] - 4s 451ms/step - loss: 0.5687 - accuracy: 0.9954 Epoch 187/200 9/9 [==============================] - 4s 437ms/step - loss: 0.5715 - accuracy: 0.9913 Epoch 188/200 9/9 [==============================] - 4s 452ms/step - loss: 0.5818 - accuracy: 0.9979 Epoch 189/200 9/9 [==============================] - 4s 454ms/step - loss: 0.5616 - accuracy: 0.9989 Epoch 190/200 9/9 [==============================] - 4s 428ms/step - loss: 0.5627 - accuracy: 0.9947 Epoch 191/200 9/9 [==============================] - 4s 455ms/step - loss: 0.5726 - accuracy: 0.9855 Epoch 192/200 9/9 [==============================] - 4s 435ms/step - loss: 0.5652 - accuracy: 0.9943 Epoch 193/200 9/9 [==============================] - 4s 454ms/step - loss: 0.5592 - accuracy: 0.9932 Epoch 194/200 9/9 [==============================] - 4s 450ms/step - loss: 0.5741 - accuracy: 0.9912 Epoch 195/200 9/9 [==============================] - 4s 420ms/step - loss: 0.5391 - accuracy: 1.0000 Epoch 196/200 9/9 [==============================] - 4s 457ms/step - loss: 0.5336 - accuracy: 0.9989 Epoch 197/200 9/9 [==============================] - 4s 461ms/step - loss: 0.5439 - accuracy: 0.9982 Epoch 198/200 9/9 [==============================] - 4s 422ms/step - loss: 0.5817 - accuracy: 0.9827 Epoch 199/200 9/9 [==============================] - 4s 444ms/step - 
loss: 0.5630 - accuracy: 0.9989 Epoch 200/200 9/9 [==============================] - 4s 441ms/step - loss: 0.5488 - accuracy: 1.0000
import matplotlib.pyplot as plt
acc = history.history['accuracy']
loss = history.history['loss']
epochs = range(len(acc))
plt.plot(epochs, acc, 'b', label = 'Training accuracy')
plt.title('Training accuracy')
plt.figure()
plt.plot(epochs, loss, 'b', label = 'Training Loss')
plt.title('Training loss')
plt.legend()
plt.show()
Once the model has been trained, we can provide a seed text, and ask the model to complete the sentence by generating text after the seed.
seed_text = "She provided him with"
next_words = 10
for _ in range(next_words):
    token_list = tokenizer.texts_to_sequences([seed_text])[0]
    token_list = pad_sequences([token_list], maxlen=max_sequence_len-1, padding='pre')
    # index of the most probable next word
    predicted = np.argmax(model.predict(token_list, verbose=0), axis=-1)[0]
    output_word = ""
    for word, index in tokenizer.word_index.items():
        if index == predicted:
            output_word = word
            break
    seed_text += " " + output_word
print(seed_text)
She provided him with garret was under the roof of a high five storied
The sentence generated by the LSTM is not bad, though probably no better than the one generated by the Markov Chain. Let us now modify the LSTM model into a Transformer-like model by adding an Attention layer, and check whether the text generation improves. First, however, we will introduce and explain the Attention concept in the next section.
The concepts of Attention and the Transformer model were first presented in 2017 in the article "Attention Is All You Need" at the NeurIPS conference; a preprint with more details was deposited on arXiv.
from IPython.display import Image
Image('/home/nikolay/Documents/Medium/Attention/attention_all_you_need.png', width = 1000)
Attention was introduced as a special operator (or a layer in a neural network) that can figure out the hub words of greatest importance in each sentence. This increases the effective memory of the model and allows it to account for longer context in sequential data such as text or, in our case of interest, biological sequences.
from IPython.display import Image
Image('/home/nikolay/Documents/Medium/Attention/attention_weights.png', width = 1000)
Suppose we have a 3D vector representation (3D word embeddings) of 4 words. Recall from the first section that words can be converted to vectors of arbitrary dimension with the Embedding layer of a neural network, or with a pre-trained Word2Vec model, which can also preserve some semantic relationships between the words. Below, we will make up 3D vectors, i.e. 3D word embeddings, for 4 words:
import numpy as np
X = np.array([[0, 1, 1], [0, 0, 1], [1, 1, 0], [1, 0, 0]])
X
array([[0, 1, 1],
[0, 0, 1],
[1, 1, 0],
[1, 0, 0]])
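As an aside, such dense vectors would normally come from a trained Embedding layer rather than be written by hand. Under the hood, an Embedding layer is just a lookup table, which can be sketched in plain NumPy (the vocabulary size and word indices below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, emb_dim = 10, 3           # illustrative sizes
# An Embedding layer is a trainable lookup table with one row per word
embedding_matrix = rng.normal(size=(vocab_size, emb_dim))
word_ids = np.array([1, 2, 3, 4])     # a "sentence" of 4 word indices
vectors = embedding_matrix[word_ids]  # shape (4, 3): 3D embeddings of 4 words
print(vectors.shape)
```

During training, the rows of the lookup table are adjusted by backpropagation just like any other weights.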
Attention is a sequence-to-sequence operation $Y = f(X)$: a sequence of vectors $X$ goes in, and a sequence of vectors $Y$ comes out. To produce the output vectors $Y$, the attention operation simply takes a weighted average over all the input vectors. Here, however, the weights $W$ are not learnable parameters but are expressed via the input vectors $X$ themselves. The simplest option for these weights $W$ is the dot product $X * X^T$, i.e. the matrix of pairwise similarities between the words. This allows all words to interact with all other words.
from IPython.display import Image
Image('/home/nikolay/Documents/Medium/Attention/word_embeddings.png', width = 1000)
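The parameter-free attention described above can be sketched with the toy matrix $X$: the weights are the pairwise word similarities $X X^T$ (a 4 x 4 matrix), row-normalized with a softmax, and the output is their weighted average of the input vectors:

```python
import numpy as np
from scipy.special import softmax

# 3D embeddings of 4 words, as defined above
X = np.array([[0, 1, 1], [0, 0, 1], [1, 1, 0], [1, 0, 0]], dtype=float)

# Weights: row-wise softmax of the pairwise word similarities X @ X.T
weights = softmax(X @ X.T, axis=1)  # shape (4, 4), each row sums to 1
Y = weights @ X                     # shape (4, 3), same shape as X
print(Y.shape)
```

Note that this version has nothing to learn: the output is fully determined by the input.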
Now, we need to introduce the parameters to be learnt during training. For this purpose, we build three (3 x 3) matrices $W_Q$, $W_K$ and $W_V$, which we multiply by the input vectors $X$, $X^T$ and $X$, respectively. The resulting products are called queries (Q), keys (K) and values (V) in the Attention terminology. Below, I provide the matrix dimensions in parentheses in order to keep track of the dimensionality of each resulting product. The elements of the $W_Q$, $W_K$ and $W_V$ matrices are the parameters to be learnt during training:
$$\Large Q(4x3) = X(4x3) * W_Q(3x3)$$
$$\Large K(3x4) = W_K(3x3) * X^T(3x4)$$
$$\Large V(4x3) = X(4x3) * W_V(3x3)$$
Below, we will assign some float values to $W_Q$, $W_K$ and $W_V$, and perform their multiplication with the data $X$ in order to obtain the queries, keys and values. Please note that the dimensions of $W_Q$, $W_K$ and $W_V$ are (3 x 3), i.e. they are square matrices of the size of the embedding vectors of the word data $X$.
W_Q = np.array([[0.1, 0.2, 0.3], [0.4, 0.5, 0.6], [0.7, 0.8, 0.9]])
W_K = np.array([[0.1, 0.2, 0.3], [0.4, 0.5, 0.6], [0.7, 0.8, 0.9]])
W_V = np.array([[0.1, 0.2, 0.3], [0.4, 0.5, 0.6], [0.7, 0.8, 0.9]])
Q = X @ W_Q
print("Matrix of Queries:")
print(Q)
print("\n")
K = W_K @ X.T
print("Matrix of Keys:")
print(K)
print("\n")
V = X @ W_V
print("Matrix of Values:")
print(V)
print("\n")
Matrix of Queries:
[[1.1 1.3 1.5]
 [0.7 0.8 0.9]
 [0.5 0.7 0.9]
 [0.1 0.2 0.3]]

Matrix of Keys:
[[0.5 0.3 0.3 0.1]
 [1.1 0.6 0.9 0.4]
 [1.7 0.9 1.5 0.7]]

Matrix of Values:
[[1.1 1.3 1.5]
 [0.7 0.8 0.9]
 [0.5 0.7 0.9]
 [0.1 0.2 0.3]]
Now, instead of $X * X^T * X$, after the multiplication by $W_Q$, $W_K$ and $W_V$ (note that $W_Q$ and $W_V$ multiply $X$ from the right, while $W_K$ multiplies $X^T$ from the left), we get the product of queries, keys and values, $Q * K * V$. The product of the queries and keys, $Q * K$, is called the attention weights. It contains information about how strongly the words are related to each other.
from IPython.display import Image
Image('/home/nikolay/Documents/Medium/Attention/attention_matrix_mult.png', width = 1000)
The attention weights $Q * K$ are normalized by a softmax operation and can be interpreted as probabilities after this transformation.
from scipy.special import softmax
attention_weights = softmax((Q @ K) / K.shape[1] ** 0.5, axis = 1)
attention_weights
array([[0.43998752, 0.15629518, 0.29789658, 0.10582072],
[0.36520726, 0.19450639, 0.2872822 , 0.15300415],
[0.35470158, 0.19760633, 0.28751551, 0.16017658],
[0.28061018, 0.23438532, 0.26426871, 0.22073578]])
Finally, all we need to do is multiply the attention weights by the transformed $X$ data matrix, i.e. the matrix of values $V$. The result is a matrix of the same dimensions as the initial $X$ data matrix.
attention = attention_weights @ V
attention
array([[0.75292326, 0.92671167, 1.10050008],
[0.69682397, 0.86207292, 1.02732187],
[0.68827158, 0.85249329, 1.016715 ],
[0.62694886, 0.78143675, 0.93592464]])
Therefore, after applying the Attention operator to the data matrix $X$, we obtain a matrix of the same dimensions, slightly modified to take into account the relations between all the words. As the Attention theory suggests, more closely related words get higher weights, which is encoded in the final attention matrix.
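For reference, the whole step-by-step computation above can be collected into a single function (same toy matrices and same scaling factor as in the code above):

```python
import numpy as np
from scipy.special import softmax

def self_attention(X, W_Q, W_K, W_V):
    """Single-head self-attention, exactly as computed step by step above."""
    Q = X @ W_Q                 # queries, (n_words x d)
    K = W_K @ X.T               # keys,    (d x n_words)
    V = X @ W_V                 # values,  (n_words x d)
    # attention weights, softmax-normalized row-wise
    weights = softmax((Q @ K) / K.shape[1] ** 0.5, axis=1)
    return weights @ V          # same shape as X

X = np.array([[0, 1, 1], [0, 0, 1], [1, 1, 0], [1, 0, 0]])
W = np.array([[0.1, 0.2, 0.3], [0.4, 0.5, 0.6], [0.7, 0.8, 0.9]])
print(self_attention(X, W, W, W))
```

Running it with the same $W_Q = W_K = W_V$ matrix as above reproduces the attention matrix we just computed by hand.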
In this section we will demonstrate how to generate text with an LSTM neural network with added Attention layer.
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.layers import Embedding, LSTM, Dense, Dropout, Bidirectional, Attention
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.models import Sequential
from tensorflow.keras.optimizers import Adam
from tensorflow.keras import regularizers
import tensorflow.keras.utils as ku
import numpy as np
tokenizer = Tokenizer()
data = open('crime_and_punishment.txt').read()
corpus = data.lower().split("\n")
corpus = corpus[0:5]
tokenizer.fit_on_texts(corpus)
total_words = len(tokenizer.word_index) + 1
corpus
['on an exceptionally hot evening early in july a young man came out of the garret in which he lodged in s. place and walked slowly, as though in hesitation, towards k. bridge.', '', 'he had successfully avoided meeting his landlady on the staircase. his garret was under the roof of a high, five-storied house and was more like a cupboard than a room. the landlady who provided him with garret, dinners, and attendance, lived on the floor below, and every time he went out he was obliged to pass her kitchen, the door of which invariably stood open. and each time he passed, the young man had a sick, frightened feeling, which made him scowl and feel ashamed. he was hopelessly in debt to his landlady, and was afraid of meeting her.', '', 'this was not because he was cowardly and abject, quite the contrary; but for some time past he had been in an overstrained irritable condition, verging on hypochondria. he had become so completely absorbed in himself, and isolated from his fellows that he dreaded meeting, not only his landlady, but anyone at all. he was crushed by poverty, but the anxieties of his position had of late ceased to weigh upon him. he had given up attending to matters of practical importance; he had lost all desire to do so. nothing that any landlady could do had a real terror for him. but to be stopped on the stairs, to be forced to listen to her trivial, irrelevant gossip, to pestering demands for payment, threats and complaints, and to rack his brains for excuses, to prevaricate, to lie—no, rather than that, he would creep down the stairs like a cat and slip out unseen.']
input_sequences = []
for line in corpus:
    token_list = tokenizer.texts_to_sequences([line])[0]
    for i in range(1, len(token_list)):
        n_gram_sequence = token_list[:i+1]
        input_sequences.append(n_gram_sequence)
input_sequences[0:10]
[[11, 23], [11, 23, 34], [11, 23, 34, 35], [11, 23, 34, 35, 36], [11, 23, 34, 35, 36, 37], [11, 23, 34, 35, 36, 37, 7], [11, 23, 34, 35, 36, 37, 7, 38], [11, 23, 34, 35, 36, 37, 7, 38, 8], [11, 23, 34, 35, 36, 37, 7, 38, 8, 24], [11, 23, 34, 35, 36, 37, 7, 38, 8, 24, 25]]
max_sequence_len = max([len(x) for x in input_sequences])
input_sequences = np.array(pad_sequences(input_sequences, maxlen=max_sequence_len, padding='pre'))
input_sequences
array([[ 0, 0, 0, ..., 0, 11, 23],
[ 0, 0, 0, ..., 11, 23, 34],
[ 0, 0, 0, ..., 23, 34, 35],
...,
[ 0, 0, 92, ..., 159, 3, 160],
[ 0, 92, 6, ..., 3, 160, 16],
[ 92, 6, 28, ..., 160, 16, 161]], dtype=int32)
predictors, label = input_sequences[:,:-1],input_sequences[:,-1]
label = ku.to_categorical(label, num_classes=total_words)
label
array([[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
...,
[0., 0., 0., ..., 0., 1., 0.],
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 1.]], dtype=float32)
The architecture of this Transformer-like neural network is exactly the same as the LSTM model in the previous section; the only difference is that an attention layer, model.add(SeqSelfAttention()), has been added, which brings more fitting parameters.
import keras
from keras_self_attention import SeqSelfAttention
model = Sequential()
model.add(Embedding(total_words, 100, input_length = max_sequence_len-1))
model.add(Bidirectional(LSTM(150, return_sequences = True)))
model.add(SeqSelfAttention())
model.add(Dropout(0.2))
model.add(LSTM(100))
model.add(Dense(total_words // 2, activation = 'relu', kernel_regularizer = regularizers.l2(0.01)))
model.add(Dense(total_words, activation = 'softmax'))
model.compile(loss = 'categorical_crossentropy', optimizer = 'adam', metrics = ['accuracy'])
model.summary()
Using TensorFlow backend.
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding (Embedding)        (None, 153, 100)          16200     
_________________________________________________________________
bidirectional (Bidirectional (None, 153, 300)          301200    
_________________________________________________________________
seq_self_attention (SeqSelfA (None, None, 300)         19265     
_________________________________________________________________
dropout (Dropout)            (None, None, 300)         0         
_________________________________________________________________
lstm_1 (LSTM)                (None, 100)               160400    
_________________________________________________________________
dense (Dense)                (None, 81)                8181      
_________________________________________________________________
dense_1 (Dense)              (None, 162)               13284     
=================================================================
Total params: 518,530
Trainable params: 518,530
Non-trainable params: 0
_________________________________________________________________
history = model.fit(predictors, label, epochs = 1000, verbose = 1)
Epoch 1/1000 9/9 [==============================] - 9s 521ms/step - loss: 5.9452 - accuracy: 0.0196
...
Epoch 157/1000 9/9 [==============================] - 6s 653ms/step - loss: 3.0708 - accuracy: 0.2552 Epoch 158/1000 9/9 
[==============================] - 6s 653ms/step - loss: 3.0342 - accuracy: 0.2844 Epoch 159/1000 9/9 [==============================] - 6s 665ms/step - loss: 3.0598 - accuracy: 0.2771 Epoch 160/1000 9/9 [==============================] - 6s 625ms/step - loss: 3.0309 - accuracy: 0.2729 Epoch 161/1000 9/9 [==============================] - 6s 678ms/step - loss: 3.0205 - accuracy: 0.3025 Epoch 162/1000 9/9 [==============================] - 6s 686ms/step - loss: 2.9986 - accuracy: 0.3254 Epoch 163/1000 9/9 [==============================] - 6s 689ms/step - loss: 2.9969 - accuracy: 0.3035 Epoch 164/1000 9/9 [==============================] - 6s 621ms/step - loss: 2.9527 - accuracy: 0.2915 Epoch 165/1000 9/9 [==============================] - 6s 690ms/step - loss: 2.9751 - accuracy: 0.3054 Epoch 166/1000 9/9 [==============================] - 6s 644ms/step - loss: 2.9327 - accuracy: 0.3148 Epoch 167/1000 9/9 [==============================] - 6s 671ms/step - loss: 2.9178 - accuracy: 0.3032 Epoch 168/1000 9/9 [==============================] - 6s 643ms/step - loss: 3.0462 - accuracy: 0.3088 Epoch 169/1000 9/9 [==============================] - 6s 652ms/step - loss: 3.0232 - accuracy: 0.2644 Epoch 170/1000 9/9 [==============================] - 6s 635ms/step - loss: 2.9592 - accuracy: 0.3016 Epoch 171/1000 9/9 [==============================] - 6s 696ms/step - loss: 2.9552 - accuracy: 0.2878 Epoch 172/1000 9/9 [==============================] - 6s 629ms/step - loss: 3.0415 - accuracy: 0.2277 Epoch 173/1000 9/9 [==============================] - 6s 673ms/step - loss: 2.9035 - accuracy: 0.3307 Epoch 174/1000 9/9 [==============================] - 6s 651ms/step - loss: 3.0009 - accuracy: 0.3133 Epoch 175/1000 9/9 [==============================] - 6s 654ms/step - loss: 3.0309 - accuracy: 0.2758 Epoch 176/1000 9/9 [==============================] - 6s 650ms/step - loss: 2.8960 - accuracy: 0.3049 Epoch 177/1000 9/9 [==============================] - 6s 659ms/step - loss: 
3.0277 - accuracy: 0.2740 Epoch 178/1000 9/9 [==============================] - 6s 653ms/step - loss: 2.9275 - accuracy: 0.2878 Epoch 179/1000 9/9 [==============================] - 6s 644ms/step - loss: 2.9571 - accuracy: 0.2763 Epoch 180/1000 9/9 [==============================] - 6s 614ms/step - loss: 3.0137 - accuracy: 0.2589 Epoch 181/1000 9/9 [==============================] - 6s 677ms/step - loss: 3.0276 - accuracy: 0.2840 Epoch 182/1000 9/9 [==============================] - 6s 708ms/step - loss: 2.9374 - accuracy: 0.3189 Epoch 183/1000 9/9 [==============================] - 6s 622ms/step - loss: 2.9772 - accuracy: 0.2814 Epoch 184/1000 9/9 [==============================] - 6s 697ms/step - loss: 2.9695 - accuracy: 0.2876 Epoch 185/1000 9/9 [==============================] - 6s 678ms/step - loss: 2.8590 - accuracy: 0.3350 Epoch 186/1000 9/9 [==============================] - 6s 629ms/step - loss: 2.9778 - accuracy: 0.2979 Epoch 187/1000 9/9 [==============================] - 6s 703ms/step - loss: 2.8389 - accuracy: 0.2943 Epoch 188/1000 9/9 [==============================] - 6s 635ms/step - loss: 2.9494 - accuracy: 0.3165 Epoch 189/1000 9/9 [==============================] - 6s 658ms/step - loss: 3.0557 - accuracy: 0.2792 Epoch 190/1000 9/9 [==============================] - 6s 645ms/step - loss: 2.9141 - accuracy: 0.2686 Epoch 191/1000 9/9 [==============================] - 6s 651ms/step - loss: 3.0161 - accuracy: 0.2635 Epoch 192/1000 9/9 [==============================] - 6s 654ms/step - loss: 3.1166 - accuracy: 0.2748 Epoch 193/1000 9/9 [==============================] - 6s 653ms/step - loss: 2.9569 - accuracy: 0.2679 Epoch 194/1000 9/9 [==============================] - 6s 632ms/step - loss: 2.8488 - accuracy: 0.3005 Epoch 195/1000 9/9 [==============================] - 6s 647ms/step - loss: 2.8649 - accuracy: 0.3467 Epoch 196/1000 9/9 [==============================] - 6s 649ms/step - loss: 2.9205 - accuracy: 0.3174 Epoch 197/1000 9/9 
[==============================] - 6s 650ms/step - loss: 2.7943 - accuracy: 0.3658 Epoch 198/1000 9/9 [==============================] - 6s 657ms/step - loss: 2.8728 - accuracy: 0.3491 Epoch 199/1000 9/9 [==============================] - 6s 649ms/step - loss: 2.7652 - accuracy: 0.3479 Epoch 200/1000 9/9 [==============================] - 6s 681ms/step - loss: 2.7361 - accuracy: 0.3582 Epoch 201/1000 9/9 [==============================] - 6s 621ms/step - loss: 2.8120 - accuracy: 0.3392 Epoch 202/1000 9/9 [==============================] - 6s 646ms/step - loss: 2.8056 - accuracy: 0.3401 Epoch 203/1000 9/9 [==============================] - 6s 656ms/step - loss: 2.7640 - accuracy: 0.3471 Epoch 204/1000 9/9 [==============================] - 6s 655ms/step - loss: 2.6845 - accuracy: 0.3640 Epoch 205/1000 9/9 [==============================] - 6s 654ms/step - loss: 2.7413 - accuracy: 0.3949 Epoch 206/1000 9/9 [==============================] - 6s 666ms/step - loss: 2.8274 - accuracy: 0.3856 Epoch 207/1000 9/9 [==============================] - 6s 627ms/step - loss: 2.6453 - accuracy: 0.4401 Epoch 208/1000 9/9 [==============================] - 6s 654ms/step - loss: 2.7259 - accuracy: 0.3964 Epoch 209/1000 9/9 [==============================] - 6s 649ms/step - loss: 2.6795 - accuracy: 0.4007 Epoch 210/1000 9/9 [==============================] - 6s 658ms/step - loss: 2.6709 - accuracy: 0.3460 Epoch 211/1000 9/9 [==============================] - 6s 623ms/step - loss: 2.6589 - accuracy: 0.3887 Epoch 212/1000 9/9 [==============================] - 6s 685ms/step - loss: 2.6677 - accuracy: 0.3601 Epoch 213/1000 9/9 [==============================] - 6s 689ms/step - loss: 2.5031 - accuracy: 0.4625 Epoch 214/1000 9/9 [==============================] - 6s 641ms/step - loss: 2.5606 - accuracy: 0.4289 Epoch 215/1000 9/9 [==============================] - 6s 656ms/step - loss: 2.6584 - accuracy: 0.4015 Epoch 216/1000 9/9 [==============================] - 6s 656ms/step - loss: 
2.5345 - accuracy: 0.4378 Epoch 217/1000 9/9 [==============================] - 6s 655ms/step - loss: 2.5707 - accuracy: 0.4505 Epoch 218/1000 9/9 [==============================] - 6s 652ms/step - loss: 2.6615 - accuracy: 0.3948 Epoch 219/1000 9/9 [==============================] - 6s 654ms/step - loss: 2.6071 - accuracy: 0.4006 Epoch 220/1000 9/9 [==============================] - 6s 652ms/step - loss: 2.6121 - accuracy: 0.4163 Epoch 221/1000 9/9 [==============================] - 6s 648ms/step - loss: 2.6277 - accuracy: 0.3908 Epoch 222/1000 9/9 [==============================] - 6s 648ms/step - loss: 2.6470 - accuracy: 0.3986 Epoch 223/1000 9/9 [==============================] - 6s 623ms/step - loss: 2.6104 - accuracy: 0.4267 Epoch 224/1000 9/9 [==============================] - 6s 653ms/step - loss: 2.5205 - accuracy: 0.4249 Epoch 225/1000 9/9 [==============================] - 6s 650ms/step - loss: 2.5106 - accuracy: 0.4402 Epoch 226/1000 9/9 [==============================] - 6s 642ms/step - loss: 2.5445 - accuracy: 0.4416 Epoch 227/1000 9/9 [==============================] - 6s 648ms/step - loss: 2.5169 - accuracy: 0.4252 Epoch 228/1000 9/9 [==============================] - 7s 736ms/step - loss: 2.4581 - accuracy: 0.4692 Epoch 229/1000 9/9 [==============================] - 6s 615ms/step - loss: 2.6026 - accuracy: 0.3892 Epoch 230/1000 9/9 [==============================] - 6s 651ms/step - loss: 2.4171 - accuracy: 0.4750 Epoch 231/1000 9/9 [==============================] - 6s 651ms/step - loss: 2.4802 - accuracy: 0.4168 Epoch 232/1000 9/9 [==============================] - 6s 653ms/step - loss: 2.4674 - accuracy: 0.4547 Epoch 233/1000 9/9 [==============================] - 6s 646ms/step - loss: 2.4330 - accuracy: 0.4509 Epoch 234/1000 9/9 [==============================] - 6s 653ms/step - loss: 2.4375 - accuracy: 0.4343 Epoch 235/1000 9/9 [==============================] - 6s 649ms/step - loss: 2.3739 - accuracy: 0.4720 Epoch 236/1000 9/9 
[==============================] - 6s 638ms/step - loss: 2.5080 - accuracy: 0.4390 Epoch 237/1000 9/9 [==============================] - 6s 654ms/step - loss: 2.5569 - accuracy: 0.3713 Epoch 238/1000 9/9 [==============================] - 6s 659ms/step - loss: 2.4602 - accuracy: 0.4188 Epoch 239/1000 9/9 [==============================] - 6s 655ms/step - loss: 2.3759 - accuracy: 0.4773 Epoch 240/1000 9/9 [==============================] - 6s 653ms/step - loss: 2.4773 - accuracy: 0.4280 Epoch 241/1000 9/9 [==============================] - 6s 627ms/step - loss: 2.3824 - accuracy: 0.4133 Epoch 242/1000 9/9 [==============================] - 6s 653ms/step - loss: 2.5378 - accuracy: 0.3992 Epoch 243/1000 9/9 [==============================] - 6s 657ms/step - loss: 2.3831 - accuracy: 0.4390 Epoch 244/1000 9/9 [==============================] - 6s 654ms/step - loss: 2.3381 - accuracy: 0.4864 Epoch 245/1000 9/9 [==============================] - 6s 652ms/step - loss: 2.5432 - accuracy: 0.3997 Epoch 246/1000 9/9 [==============================] - 6s 645ms/step - loss: 2.3615 - accuracy: 0.4588 Epoch 247/1000 9/9 [==============================] - 6s 639ms/step - loss: 2.3040 - accuracy: 0.4760 Epoch 248/1000 9/9 [==============================] - 6s 652ms/step - loss: 2.2703 - accuracy: 0.4825 Epoch 249/1000 9/9 [==============================] - 6s 656ms/step - loss: 2.3966 - accuracy: 0.4337 Epoch 250/1000 9/9 [==============================] - 6s 646ms/step - loss: 2.3843 - accuracy: 0.4913 Epoch 251/1000 9/9 [==============================] - 6s 618ms/step - loss: 2.3960 - accuracy: 0.4838 Epoch 252/1000 9/9 [==============================] - 6s 669ms/step - loss: 2.3341 - accuracy: 0.4782 Epoch 253/1000 9/9 [==============================] - 6s 691ms/step - loss: 2.2564 - accuracy: 0.4934 Epoch 254/1000 9/9 [==============================] - 6s 623ms/step - loss: 2.2872 - accuracy: 0.4865 Epoch 255/1000 9/9 [==============================] - 6s 697ms/step - loss: 
2.2259 - accuracy: 0.5105 Epoch 256/1000 9/9 [==============================] - 6s 670ms/step - loss: 2.2209 - accuracy: 0.5516 Epoch 257/1000 9/9 [==============================] - 6s 618ms/step - loss: 2.2812 - accuracy: 0.5120 Epoch 258/1000 9/9 [==============================] - 6s 672ms/step - loss: 2.2391 - accuracy: 0.5286 Epoch 259/1000 9/9 [==============================] - 6s 690ms/step - loss: 2.2640 - accuracy: 0.4788 Epoch 260/1000 9/9 [==============================] - 6s 658ms/step - loss: 2.2536 - accuracy: 0.5307 Epoch 261/1000 9/9 [==============================] - 6s 652ms/step - loss: 2.2860 - accuracy: 0.5154 Epoch 262/1000 9/9 [==============================] - 6s 651ms/step - loss: 2.2055 - accuracy: 0.5153 Epoch 263/1000 9/9 [==============================] - 6s 644ms/step - loss: 2.2430 - accuracy: 0.4965 Epoch 264/1000 9/9 [==============================] - 6s 648ms/step - loss: 2.2075 - accuracy: 0.5099 Epoch 265/1000 9/9 [==============================] - 6s 648ms/step - loss: 2.2489 - accuracy: 0.5296 Epoch 266/1000 9/9 [==============================] - 6s 649ms/step - loss: 2.1654 - accuracy: 0.5181 Epoch 267/1000 9/9 [==============================] - 6s 633ms/step - loss: 2.1766 - accuracy: 0.5192 Epoch 268/1000 9/9 [==============================] - 6s 681ms/step - loss: 2.1223 - accuracy: 0.5611 Epoch 269/1000 9/9 [==============================] - 6s 639ms/step - loss: 2.1552 - accuracy: 0.5219 Epoch 270/1000 9/9 [==============================] - 6s 655ms/step - loss: 2.1763 - accuracy: 0.5045 Epoch 271/1000 9/9 [==============================] - 6s 654ms/step - loss: 2.0870 - accuracy: 0.5361 Epoch 272/1000 9/9 [==============================] - 6s 648ms/step - loss: 2.1932 - accuracy: 0.5161 Epoch 273/1000 9/9 [==============================] - 6s 653ms/step - loss: 2.1499 - accuracy: 0.5112 Epoch 274/1000 9/9 [==============================] - 6s 652ms/step - loss: 2.1623 - accuracy: 0.5302 Epoch 275/1000 9/9 
[==============================] - 6s 659ms/step - loss: 2.0299 - accuracy: 0.5562 Epoch 276/1000 9/9 [==============================] - 7s 786ms/step - loss: 2.0796 - accuracy: 0.5767 Epoch 277/1000 9/9 [==============================] - 6s 653ms/step - loss: 2.0952 - accuracy: 0.5519 Epoch 278/1000 9/9 [==============================] - 6s 653ms/step - loss: 2.1047 - accuracy: 0.5568 Epoch 279/1000 9/9 [==============================] - 6s 645ms/step - loss: 2.0264 - accuracy: 0.5905 Epoch 280/1000 9/9 [==============================] - 6s 647ms/step - loss: 2.0797 - accuracy: 0.5543 Epoch 281/1000 9/9 [==============================] - 6s 634ms/step - loss: 2.0353 - accuracy: 0.5947 Epoch 282/1000 9/9 [==============================] - 6s 647ms/step - loss: 1.9882 - accuracy: 0.5801 Epoch 283/1000 9/9 [==============================] - 6s 646ms/step - loss: 2.0540 - accuracy: 0.5491 Epoch 284/1000 9/9 [==============================] - 6s 662ms/step - loss: 2.0046 - accuracy: 0.5579 Epoch 285/1000 9/9 [==============================] - 6s 669ms/step - loss: 2.0820 - accuracy: 0.5633 Epoch 286/1000 9/9 [==============================] - 6s 627ms/step - loss: 2.0551 - accuracy: 0.5718 Epoch 287/1000 9/9 [==============================] - 6s 677ms/step - loss: 2.0252 - accuracy: 0.5681 Epoch 288/1000 9/9 [==============================] - 6s 654ms/step - loss: 2.0187 - accuracy: 0.5750 Epoch 289/1000 9/9 [==============================] - 6s 655ms/step - loss: 1.9928 - accuracy: 0.6079 Epoch 290/1000 9/9 [==============================] - 6s 665ms/step - loss: 2.0011 - accuracy: 0.5776 Epoch 291/1000 9/9 [==============================] - 6s 656ms/step - loss: 1.9456 - accuracy: 0.6308 Epoch 292/1000 9/9 [==============================] - 6s 653ms/step - loss: 2.0407 - accuracy: 0.5461 Epoch 293/1000 9/9 [==============================] - 6s 681ms/step - loss: 1.9890 - accuracy: 0.5908 Epoch 294/1000 9/9 [==============================] - 6s 648ms/step - loss: 
2.0111 - accuracy: 0.5810 Epoch 295/1000 9/9 [==============================] - 6s 659ms/step - loss: 1.9478 - accuracy: 0.5988 Epoch 296/1000 9/9 [==============================] - 6s 652ms/step - loss: 1.9762 - accuracy: 0.5906 Epoch 297/1000 9/9 [==============================] - 6s 651ms/step - loss: 1.9258 - accuracy: 0.6194 Epoch 298/1000 9/9 [==============================] - 6s 653ms/step - loss: 1.9300 - accuracy: 0.6071 Epoch 299/1000 9/9 [==============================] - 6s 617ms/step - loss: 1.8566 - accuracy: 0.6432 Epoch 300/1000 9/9 [==============================] - 6s 655ms/step - loss: 2.0251 - accuracy: 0.5930 Epoch 301/1000 9/9 [==============================] - 6s 648ms/step - loss: 2.0428 - accuracy: 0.5526 Epoch 302/1000 9/9 [==============================] - 6s 659ms/step - loss: 1.9185 - accuracy: 0.6262 Epoch 303/1000 9/9 [==============================] - 6s 666ms/step - loss: 1.9196 - accuracy: 0.5804 Epoch 304/1000 9/9 [==============================] - 6s 623ms/step - loss: 1.9625 - accuracy: 0.5726 Epoch 305/1000 9/9 [==============================] - 6s 672ms/step - loss: 1.8854 - accuracy: 0.6197 Epoch 306/1000 9/9 [==============================] - 6s 700ms/step - loss: 1.8923 - accuracy: 0.5753 Epoch 307/1000 9/9 [==============================] - 6s 644ms/step - loss: 1.8631 - accuracy: 0.6282 Epoch 308/1000 9/9 [==============================] - 6s 668ms/step - loss: 1.9282 - accuracy: 0.5910 Epoch 309/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.8699 - accuracy: 0.6028 Epoch 310/1000 9/9 [==============================] - 6s 654ms/step - loss: 1.8828 - accuracy: 0.6184 Epoch 311/1000 9/9 [==============================] - 6s 657ms/step - loss: 1.9114 - accuracy: 0.5910 Epoch 312/1000 9/9 [==============================] - 6s 651ms/step - loss: 1.8140 - accuracy: 0.6347 Epoch 313/1000 9/9 [==============================] - 6s 636ms/step - loss: 1.9277 - accuracy: 0.5906 Epoch 314/1000 9/9 
[==============================] - 6s 658ms/step - loss: 1.8788 - accuracy: 0.6067 Epoch 315/1000 9/9 [==============================] - 6s 651ms/step - loss: 1.8886 - accuracy: 0.5832 Epoch 316/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.8783 - accuracy: 0.5756 Epoch 317/1000 9/9 [==============================] - 6s 648ms/step - loss: 1.8470 - accuracy: 0.6596 Epoch 318/1000 9/9 [==============================] - 6s 634ms/step - loss: 1.9369 - accuracy: 0.5596 Epoch 319/1000 9/9 [==============================] - 6s 649ms/step - loss: 1.8819 - accuracy: 0.6032 Epoch 320/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.9241 - accuracy: 0.5608 Epoch 321/1000 9/9 [==============================] - 6s 656ms/step - loss: 1.9024 - accuracy: 0.5803 Epoch 322/1000 9/9 [==============================] - 6s 655ms/step - loss: 1.8425 - accuracy: 0.6121 Epoch 323/1000 9/9 [==============================] - 6s 649ms/step - loss: 1.7886 - accuracy: 0.6129 Epoch 324/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.8117 - accuracy: 0.6432 Epoch 325/1000 9/9 [==============================] - 6s 646ms/step - loss: 1.9190 - accuracy: 0.5739 Epoch 326/1000 9/9 [==============================] - 6s 646ms/step - loss: 1.7509 - accuracy: 0.6550 Epoch 327/1000 9/9 [==============================] - 6s 649ms/step - loss: 1.7055 - accuracy: 0.6337 Epoch 328/1000 9/9 [==============================] - 6s 662ms/step - loss: 1.7179 - accuracy: 0.6865 Epoch 329/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.7588 - accuracy: 0.6553 Epoch 330/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.6750 - accuracy: 0.6628 Epoch 331/1000 9/9 [==============================] - 6s 651ms/step - loss: 1.6973 - accuracy: 0.6536 Epoch 332/1000 9/9 [==============================] - 6s 647ms/step - loss: 1.7578 - accuracy: 0.6341 Epoch 333/1000 9/9 [==============================] - 6s 617ms/step - loss: 
1.6997 - accuracy: 0.6362 Epoch 334/1000 9/9 [==============================] - 6s 653ms/step - loss: 1.5942 - accuracy: 0.6834 Epoch 335/1000 9/9 [==============================] - 6s 646ms/step - loss: 1.7638 - accuracy: 0.6203 Epoch 336/1000 9/9 [==============================] - 6s 626ms/step - loss: 1.7117 - accuracy: 0.6357 Epoch 337/1000 9/9 [==============================] - 6s 688ms/step - loss: 1.7015 - accuracy: 0.6467 Epoch 338/1000 9/9 [==============================] - 6s 652ms/step - loss: 1.6621 - accuracy: 0.6788 Epoch 339/1000 9/9 [==============================] - 6s 645ms/step - loss: 1.8087 - accuracy: 0.6053 Epoch 340/1000 9/9 [==============================] - 6s 651ms/step - loss: 1.7257 - accuracy: 0.6348 Epoch 341/1000 9/9 [==============================] - 6s 656ms/step - loss: 1.7615 - accuracy: 0.6036 Epoch 342/1000 9/9 [==============================] - 6s 655ms/step - loss: 1.7484 - accuracy: 0.6304 Epoch 343/1000 9/9 [==============================] - 6s 633ms/step - loss: 1.6994 - accuracy: 0.6575 Epoch 344/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.7884 - accuracy: 0.5934 Epoch 345/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.7050 - accuracy: 0.6208 Epoch 346/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.7890 - accuracy: 0.5963 Epoch 347/1000 9/9 [==============================] - 6s 654ms/step - loss: 1.7420 - accuracy: 0.6082 Epoch 348/1000 9/9 [==============================] - 6s 634ms/step - loss: 1.6958 - accuracy: 0.6078 Epoch 349/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.7224 - accuracy: 0.6526 Epoch 350/1000 9/9 [==============================] - 6s 648ms/step - loss: 1.6211 - accuracy: 0.6885 Epoch 351/1000 9/9 [==============================] - 6s 655ms/step - loss: 1.5820 - accuracy: 0.6794 Epoch 352/1000 9/9 [==============================] - 6s 655ms/step - loss: 1.6496 - accuracy: 0.6399 Epoch 353/1000 9/9 
[==============================] - 6s 624ms/step - loss: 1.6059 - accuracy: 0.6678 Epoch 354/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.5994 - accuracy: 0.6740 Epoch 355/1000 9/9 [==============================] - 6s 646ms/step - loss: 1.6729 - accuracy: 0.6523 Epoch 356/1000 9/9 [==============================] - 6s 657ms/step - loss: 1.5756 - accuracy: 0.6841 Epoch 357/1000 9/9 [==============================] - 6s 651ms/step - loss: 1.5859 - accuracy: 0.6812 Epoch 358/1000 9/9 [==============================] - 6s 623ms/step - loss: 1.5386 - accuracy: 0.6720 Epoch 359/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.5696 - accuracy: 0.6851 Epoch 360/1000 9/9 [==============================] - 6s 652ms/step - loss: 1.5316 - accuracy: 0.7214 Epoch 361/1000 9/9 [==============================] - 6s 633ms/step - loss: 1.5689 - accuracy: 0.7016 Epoch 362/1000 9/9 [==============================] - 6s 646ms/step - loss: 1.5335 - accuracy: 0.6566 Epoch 363/1000 9/9 [==============================] - 6s 657ms/step - loss: 1.5100 - accuracy: 0.7058 Epoch 364/1000 9/9 [==============================] - 6s 649ms/step - loss: 1.5807 - accuracy: 0.6968 Epoch 365/1000 9/9 [==============================] - 6s 648ms/step - loss: 1.5072 - accuracy: 0.7130 Epoch 366/1000 9/9 [==============================] - 6s 648ms/step - loss: 1.6080 - accuracy: 0.6532 Epoch 367/1000 9/9 [==============================] - 6s 652ms/step - loss: 1.5273 - accuracy: 0.6898 Epoch 368/1000 9/9 [==============================] - 6s 654ms/step - loss: 1.5204 - accuracy: 0.6930 Epoch 369/1000 9/9 [==============================] - 6s 643ms/step - loss: 1.4795 - accuracy: 0.7313 Epoch 370/1000 9/9 [==============================] - 6s 659ms/step - loss: 1.5564 - accuracy: 0.6755 Epoch 371/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.4546 - accuracy: 0.7233 Epoch 372/1000 9/9 [==============================] - 6s 632ms/step - loss: 
1.3723 - accuracy: 0.7880 Epoch 373/1000 9/9 [==============================] - 6s 669ms/step - loss: 1.3951 - accuracy: 0.7570 Epoch 374/1000 9/9 [==============================] - 6s 651ms/step - loss: 1.4266 - accuracy: 0.7382 Epoch 375/1000 9/9 [==============================] - 6s 654ms/step - loss: 1.4949 - accuracy: 0.7423 Epoch 376/1000 9/9 [==============================] - 6s 649ms/step - loss: 1.4119 - accuracy: 0.7753 Epoch 377/1000 9/9 [==============================] - 6s 637ms/step - loss: 1.4050 - accuracy: 0.7443 Epoch 378/1000 9/9 [==============================] - 6s 666ms/step - loss: 1.4211 - accuracy: 0.7421 Epoch 379/1000 9/9 [==============================] - 6s 648ms/step - loss: 1.4094 - accuracy: 0.7509 Epoch 380/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.4443 - accuracy: 0.7252 Epoch 381/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.3957 - accuracy: 0.7620 Epoch 382/1000 9/9 [==============================] - 6s 648ms/step - loss: 1.3787 - accuracy: 0.7533 Epoch 383/1000 9/9 [==============================] - 6s 647ms/step - loss: 1.4085 - accuracy: 0.7557 Epoch 384/1000 9/9 [==============================] - 6s 652ms/step - loss: 1.3967 - accuracy: 0.7492 Epoch 385/1000 9/9 [==============================] - 6s 647ms/step - loss: 1.3528 - accuracy: 0.7713 Epoch 386/1000 9/9 [==============================] - 6s 631ms/step - loss: 1.3686 - accuracy: 0.7843 Epoch 387/1000 9/9 [==============================] - 6s 647ms/step - loss: 1.3933 - accuracy: 0.7656 Epoch 388/1000 9/9 [==============================] - 6s 650ms/step - loss: 1.3388 - accuracy: 0.7618 Epoch 389/1000 9/9 [==============================] - 6s 639ms/step - loss: 1.4094 - accuracy: 0.7432 Epoch 390/1000 9/9 [==============================] - 6s 654ms/step - loss: 1.3825 - accuracy: 0.7741 Epoch 391/1000 9/9 [==============================] - 6s 659ms/step - loss: 1.3096 - accuracy: 0.7711 Epoch 392/1000 9/9 
[==============================] - 6s 657ms/step - loss: 1.3342 - accuracy: 0.7342 Epoch 393/1000 9/9 [==============================] - 6s 622ms/step - loss: 1.3589 - accuracy: 0.7633 Epoch 394/1000 9/9 [==============================] - 6s 683ms/step - loss: 1.4357 - accuracy: 0.7349 Epoch 395/1000 9/9 [==============================] - 6s 652ms/step - loss: 1.3397 - accuracy: 0.7757 Epoch 396/1000 9/9 [==============================] - 6s 654ms/step - loss: 1.4024 - accuracy: 0.7203 Epoch 397/1000 9/9 [==============================] - 6s 653ms/step - loss: 1.3434 - accuracy: 0.7707 Epoch 398/1000 9/9 [==============================] - 6s 651ms/step - loss: 1.3152 - accuracy: 0.7757 Epoch 399/1000 9/9 [==============================] - 6s 617ms/step - loss: 1.4302 - accuracy: 0.7407 Epoch 400/1000 9/9 [==============================] - 6s 656ms/step - loss: 1.2834 - accuracy: 0.8099 Epoch 401/1000 9/9 [==============================] - 6s 626ms/step - loss: 1.3427 - accuracy: 0.7642 Epoch 402/1000 9/9 [==============================] - 6s 707ms/step - loss: 1.2452 - accuracy: 0.7831 Epoch 403/1000 9/9 [==============================] - 6s 638ms/step - loss: 1.2982 - accuracy: 0.8046 Epoch 404/1000 9/9 [==============================] - 6s 658ms/step - loss: 1.2411 - accuracy: 0.8342 Epoch 405/1000 9/9 [==============================] - 6s 653ms/step - loss: 1.2337 - accuracy: 0.8267 Epoch 406/1000 9/9 [==============================] - 6s 629ms/step - loss: 1.2691 - accuracy: 0.7974 Epoch 407/1000 9/9 [==============================] - 6s 696ms/step - loss: 1.2903 - accuracy: 0.7643 Epoch 408/1000 9/9 [==============================] - 6s 617ms/step - loss: 1.2946 - accuracy: 0.7755 Epoch 409/1000 9/9 [==============================] - 6s 671ms/step - loss: 1.3058 - accuracy: 0.7935 Epoch 410/1000 9/9 [==============================] - 6s 651ms/step - loss: 1.2061 - accuracy: 0.8568 Epoch 411/1000 9/9 [==============================] - 6s 646ms/step - loss: 
1.1819 - accuracy: 0.8285 Epoch 412/1000 9/9 [==============================] - 6s 689ms/step - loss: 1.2252 - accuracy: 0.8038
[... epochs 413-819 omitted: loss declines from ~1.22 to ~0.38 and training accuracy climbs from ~0.83 to 1.0000, with transient instabilities (loss spiking to ~1.0-2.3) around epochs 432-438, 540-546, and 781-793 before recovering ...]
Epoch 820/1000 9/9 [==============================] - 6s 655ms/step - loss: 0.3815 - accuracy: 0.9954 Epoch 821/1000 9/9
[==============================] - 6s 656ms/step - loss: 0.3828 - accuracy: 0.9982 Epoch 822/1000 9/9 [==============================] - 6s 617ms/step - loss: 0.3783 - accuracy: 0.9953 Epoch 823/1000 9/9 [==============================] - 6s 700ms/step - loss: 0.3809 - accuracy: 1.0000 Epoch 824/1000 9/9 [==============================] - 6s 634ms/step - loss: 0.3904 - accuracy: 0.9873 Epoch 825/1000 9/9 [==============================] - 6s 593ms/step - loss: 0.3700 - accuracy: 0.9993 Epoch 826/1000 9/9 [==============================] - 6s 642ms/step - loss: 0.3710 - accuracy: 0.9973 Epoch 827/1000 9/9 [==============================] - 6s 684ms/step - loss: 0.3681 - accuracy: 0.9908 Epoch 828/1000 9/9 [==============================] - 6s 653ms/step - loss: 0.3619 - accuracy: 0.9964 Epoch 829/1000 9/9 [==============================] - 6s 634ms/step - loss: 0.3526 - accuracy: 1.0000 Epoch 830/1000 9/9 [==============================] - 6s 714ms/step - loss: 0.3574 - accuracy: 0.9978 Epoch 831/1000 9/9 [==============================] - 6s 697ms/step - loss: 0.3713 - accuracy: 1.0000 Epoch 832/1000 9/9 [==============================] - 6s 622ms/step - loss: 0.3585 - accuracy: 1.0000 Epoch 833/1000 9/9 [==============================] - 6s 658ms/step - loss: 0.3608 - accuracy: 0.9964 Epoch 834/1000 9/9 [==============================] - 6s 681ms/step - loss: 0.3690 - accuracy: 0.9908 Epoch 835/1000 9/9 [==============================] - 6s 655ms/step - loss: 0.3566 - accuracy: 0.9973 Epoch 836/1000 9/9 [==============================] - 6s 664ms/step - loss: 0.3543 - accuracy: 1.0000 Epoch 837/1000 9/9 [==============================] - 6s 670ms/step - loss: 0.3508 - accuracy: 0.9958 Epoch 838/1000 9/9 [==============================] - 6s 655ms/step - loss: 0.3493 - accuracy: 0.9939 Epoch 839/1000 9/9 [==============================] - 6s 651ms/step - loss: 0.3579 - accuracy: 0.9993 Epoch 840/1000 9/9 [==============================] - 6s 647ms/step - loss: 
0.3725 - accuracy: 1.0000 Epoch 841/1000 9/9 [==============================] - 6s 647ms/step - loss: 0.3441 - accuracy: 1.0000 Epoch 842/1000 9/9 [==============================] - 6s 650ms/step - loss: 0.3406 - accuracy: 1.0000 Epoch 843/1000 9/9 [==============================] - 6s 651ms/step - loss: 0.3431 - accuracy: 1.0000 Epoch 844/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.3373 - accuracy: 1.0000 Epoch 845/1000 9/9 [==============================] - 6s 653ms/step - loss: 0.3424 - accuracy: 1.0000 Epoch 846/1000 9/9 [==============================] - 6s 658ms/step - loss: 0.3291 - accuracy: 1.0000 Epoch 847/1000 9/9 [==============================] - 6s 621ms/step - loss: 0.3361 - accuracy: 1.0000 Epoch 848/1000 9/9 [==============================] - 6s 653ms/step - loss: 0.3376 - accuracy: 1.0000 Epoch 849/1000 9/9 [==============================] - 6s 657ms/step - loss: 0.3363 - accuracy: 0.9955 Epoch 850/1000 9/9 [==============================] - 6s 656ms/step - loss: 0.3423 - accuracy: 0.9955 Epoch 851/1000 9/9 [==============================] - 6s 628ms/step - loss: 0.3345 - accuracy: 1.0000 Epoch 852/1000 9/9 [==============================] - 6s 648ms/step - loss: 0.3288 - accuracy: 1.0000 Epoch 853/1000 9/9 [==============================] - 6s 652ms/step - loss: 0.3326 - accuracy: 1.0000 Epoch 854/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.3251 - accuracy: 1.0000 Epoch 855/1000 9/9 [==============================] - 6s 651ms/step - loss: 0.3205 - accuracy: 1.0000 Epoch 856/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.3122 - accuracy: 1.0000 Epoch 857/1000 9/9 [==============================] - 6s 662ms/step - loss: 0.3328 - accuracy: 1.0000 Epoch 858/1000 9/9 [==============================] - 6s 651ms/step - loss: 0.3164 - accuracy: 1.0000 Epoch 859/1000 9/9 [==============================] - 6s 658ms/step - loss: 0.3252 - accuracy: 1.0000 Epoch 860/1000 9/9 
[==============================] - 6s 656ms/step - loss: 0.3274 - accuracy: 1.0000 Epoch 861/1000 9/9 [==============================] - 6s 664ms/step - loss: 0.3181 - accuracy: 1.0000 Epoch 862/1000 9/9 [==============================] - 6s 648ms/step - loss: 0.3193 - accuracy: 0.9985 Epoch 863/1000 9/9 [==============================] - 6s 633ms/step - loss: 0.3196 - accuracy: 1.0000 Epoch 864/1000 9/9 [==============================] - 6s 650ms/step - loss: 0.3073 - accuracy: 1.0000 Epoch 865/1000 9/9 [==============================] - 6s 646ms/step - loss: 0.3103 - accuracy: 1.0000 Epoch 866/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.3075 - accuracy: 1.0000 Epoch 867/1000 9/9 [==============================] - 6s 652ms/step - loss: 0.3085 - accuracy: 1.0000 Epoch 868/1000 9/9 [==============================] - 6s 619ms/step - loss: 0.3191 - accuracy: 1.0000 Epoch 869/1000 9/9 [==============================] - 6s 646ms/step - loss: 0.3146 - accuracy: 1.0000 Epoch 870/1000 9/9 [==============================] - 6s 650ms/step - loss: 0.3013 - accuracy: 1.0000 Epoch 871/1000 9/9 [==============================] - 6s 652ms/step - loss: 0.2994 - accuracy: 1.0000 Epoch 872/1000 9/9 [==============================] - 6s 658ms/step - loss: 0.3129 - accuracy: 1.0000 Epoch 873/1000 9/9 [==============================] - 6s 642ms/step - loss: 0.3014 - accuracy: 1.0000 Epoch 874/1000 9/9 [==============================] - 6s 648ms/step - loss: 0.3095 - accuracy: 1.0000 Epoch 875/1000 9/9 [==============================] - 6s 654ms/step - loss: 0.3130 - accuracy: 1.0000 Epoch 876/1000 9/9 [==============================] - 6s 616ms/step - loss: 0.3129 - accuracy: 1.0000 Epoch 877/1000 9/9 [==============================] - 6s 650ms/step - loss: 0.3030 - accuracy: 1.0000 Epoch 878/1000 9/9 [==============================] - 6s 673ms/step - loss: 0.3056 - accuracy: 1.0000 Epoch 879/1000 9/9 [==============================] - 6s 652ms/step - loss: 
0.2984 - accuracy: 1.0000 Epoch 880/1000 9/9 [==============================] - 6s 647ms/step - loss: 0.2994 - accuracy: 1.0000 Epoch 881/1000 9/9 [==============================] - 6s 641ms/step - loss: 0.2999 - accuracy: 1.0000 Epoch 882/1000 9/9 [==============================] - 6s 675ms/step - loss: 0.2910 - accuracy: 1.0000 Epoch 883/1000 9/9 [==============================] - 6s 675ms/step - loss: 0.2951 - accuracy: 1.0000 Epoch 884/1000 9/9 [==============================] - 6s 628ms/step - loss: 0.2915 - accuracy: 1.0000 Epoch 885/1000 9/9 [==============================] - 6s 695ms/step - loss: 0.2803 - accuracy: 1.0000 Epoch 886/1000 9/9 [==============================] - 6s 656ms/step - loss: 0.2919 - accuracy: 1.0000 Epoch 887/1000 9/9 [==============================] - 6s 621ms/step - loss: 0.2962 - accuracy: 1.0000 Epoch 888/1000 9/9 [==============================] - 6s 692ms/step - loss: 0.2862 - accuracy: 1.0000 Epoch 889/1000 9/9 [==============================] - 6s 675ms/step - loss: 0.2886 - accuracy: 1.0000 Epoch 890/1000 9/9 [==============================] - 6s 634ms/step - loss: 0.2969 - accuracy: 1.0000 Epoch 891/1000 9/9 [==============================] - 6s 656ms/step - loss: 0.2896 - accuracy: 1.0000 Epoch 892/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.2965 - accuracy: 1.0000 Epoch 893/1000 9/9 [==============================] - 6s 624ms/step - loss: 0.2896 - accuracy: 1.0000 Epoch 894/1000 9/9 [==============================] - 6s 701ms/step - loss: 0.2977 - accuracy: 1.0000 Epoch 895/1000 9/9 [==============================] - 6s 625ms/step - loss: 0.2883 - accuracy: 1.0000 Epoch 896/1000 9/9 [==============================] - 6s 645ms/step - loss: 0.2862 - accuracy: 1.0000 Epoch 897/1000 9/9 [==============================] - 6s 659ms/step - loss: 0.2835 - accuracy: 1.0000 Epoch 898/1000 9/9 [==============================] - 6s 666ms/step - loss: 0.2891 - accuracy: 1.0000 Epoch 899/1000 9/9 
[==============================] - 6s 660ms/step - loss: 0.2902 - accuracy: 1.0000 Epoch 900/1000 9/9 [==============================] - 6s 654ms/step - loss: 0.2809 - accuracy: 1.0000 Epoch 901/1000 9/9 [==============================] - 6s 621ms/step - loss: 0.2803 - accuracy: 1.0000 Epoch 902/1000 9/9 [==============================] - 6s 661ms/step - loss: 0.2735 - accuracy: 1.0000 Epoch 903/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.2788 - accuracy: 1.0000 Epoch 904/1000 9/9 [==============================] - 6s 664ms/step - loss: 0.2817 - accuracy: 1.0000 Epoch 905/1000 9/9 [==============================] - 6s 624ms/step - loss: 0.2892 - accuracy: 1.0000 Epoch 906/1000 9/9 [==============================] - 6s 652ms/step - loss: 0.2790 - accuracy: 1.0000 Epoch 907/1000 9/9 [==============================] - 6s 653ms/step - loss: 0.2798 - accuracy: 1.0000 Epoch 908/1000 9/9 [==============================] - 6s 660ms/step - loss: 0.2723 - accuracy: 1.0000 Epoch 909/1000 9/9 [==============================] - 6s 652ms/step - loss: 0.2786 - accuracy: 1.0000 Epoch 910/1000 9/9 [==============================] - 6s 638ms/step - loss: 0.2734 - accuracy: 1.0000 Epoch 911/1000 9/9 [==============================] - 6s 648ms/step - loss: 0.2843 - accuracy: 1.0000 Epoch 912/1000 9/9 [==============================] - 6s 672ms/step - loss: 0.2719 - accuracy: 1.0000 Epoch 913/1000 9/9 [==============================] - 6s 654ms/step - loss: 0.2712 - accuracy: 1.0000 Epoch 914/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.2699 - accuracy: 0.9985 Epoch 915/1000 9/9 [==============================] - 6s 655ms/step - loss: 0.2679 - accuracy: 1.0000 Epoch 916/1000 9/9 [==============================] - 6s 660ms/step - loss: 0.2771 - accuracy: 1.0000 Epoch 917/1000 9/9 [==============================] - 6s 655ms/step - loss: 0.2688 - accuracy: 1.0000 Epoch 918/1000 9/9 [==============================] - 6s 663ms/step - loss: 
0.2712 - accuracy: 1.0000 Epoch 919/1000 9/9 [==============================] - 6s 625ms/step - loss: 0.2680 - accuracy: 1.0000 Epoch 920/1000 9/9 [==============================] - 6s 648ms/step - loss: 0.2610 - accuracy: 1.0000 Epoch 921/1000 9/9 [==============================] - 6s 654ms/step - loss: 0.2610 - accuracy: 1.0000 Epoch 922/1000 9/9 [==============================] - 6s 647ms/step - loss: 0.2745 - accuracy: 1.0000 Epoch 923/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.2603 - accuracy: 1.0000 Epoch 924/1000 9/9 [==============================] - 6s 648ms/step - loss: 0.2585 - accuracy: 1.0000 Epoch 925/1000 9/9 [==============================] - 6s 655ms/step - loss: 0.2606 - accuracy: 1.0000 Epoch 926/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.2648 - accuracy: 1.0000 Epoch 927/1000 9/9 [==============================] - 6s 641ms/step - loss: 0.2663 - accuracy: 1.0000 Epoch 928/1000 9/9 [==============================] - 6s 645ms/step - loss: 0.2668 - accuracy: 1.0000 Epoch 929/1000 9/9 [==============================] - 6s 654ms/step - loss: 0.2652 - accuracy: 1.0000 Epoch 930/1000 9/9 [==============================] - 6s 645ms/step - loss: 0.2595 - accuracy: 1.0000 Epoch 931/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.2564 - accuracy: 1.0000 Epoch 932/1000 9/9 [==============================] - 6s 652ms/step - loss: 0.2512 - accuracy: 1.0000 Epoch 933/1000 9/9 [==============================] - 6s 653ms/step - loss: 0.2523 - accuracy: 1.0000 Epoch 934/1000 9/9 [==============================] - 6s 623ms/step - loss: 0.2608 - accuracy: 1.0000 Epoch 935/1000 9/9 [==============================] - 6s 644ms/step - loss: 0.2582 - accuracy: 1.0000 Epoch 936/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.2521 - accuracy: 1.0000 Epoch 937/1000 9/9 [==============================] - 6s 630ms/step - loss: 0.2558 - accuracy: 1.0000 Epoch 938/1000 9/9 
[==============================] - 6s 692ms/step - loss: 0.2614 - accuracy: 1.0000 Epoch 939/1000 9/9 [==============================] - 6s 710ms/step - loss: 0.2516 - accuracy: 1.0000 Epoch 940/1000 9/9 [==============================] - 6s 619ms/step - loss: 0.2577 - accuracy: 1.0000 Epoch 941/1000 9/9 [==============================] - 6s 673ms/step - loss: 0.2462 - accuracy: 1.0000 Epoch 942/1000 9/9 [==============================] - 7s 737ms/step - loss: 0.2516 - accuracy: 1.0000 Epoch 943/1000 9/9 [==============================] - 6s 699ms/step - loss: 0.2498 - accuracy: 1.0000 Epoch 944/1000 9/9 [==============================] - 6s 669ms/step - loss: 0.2458 - accuracy: 1.0000 Epoch 945/1000 9/9 [==============================] - 6s 632ms/step - loss: 0.2478 - accuracy: 1.0000 Epoch 946/1000 9/9 [==============================] - 6s 718ms/step - loss: 0.2578 - accuracy: 1.0000 Epoch 947/1000 9/9 [==============================] - 6s 672ms/step - loss: 0.2469 - accuracy: 1.0000 Epoch 948/1000 9/9 [==============================] - 7s 729ms/step - loss: 0.2554 - accuracy: 1.0000 Epoch 949/1000 9/9 [==============================] - 6s 622ms/step - loss: 0.2493 - accuracy: 1.0000 Epoch 950/1000 9/9 [==============================] - 6s 640ms/step - loss: 0.2548 - accuracy: 1.0000 Epoch 951/1000 9/9 [==============================] - 7s 736ms/step - loss: 0.2441 - accuracy: 1.0000 Epoch 952/1000 9/9 [==============================] - 6s 624ms/step - loss: 0.2431 - accuracy: 1.0000 Epoch 953/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.2435 - accuracy: 1.0000 Epoch 954/1000 9/9 [==============================] - 6s 657ms/step - loss: 0.2425 - accuracy: 1.0000 Epoch 955/1000 9/9 [==============================] - 6s 654ms/step - loss: 0.2506 - accuracy: 1.0000 Epoch 956/1000 9/9 [==============================] - 6s 621ms/step - loss: 0.2438 - accuracy: 1.0000 Epoch 957/1000 9/9 [==============================] - 6s 671ms/step - loss: 
0.2526 - accuracy: 1.0000 Epoch 958/1000 9/9 [==============================] - 6s 700ms/step - loss: 0.2474 - accuracy: 1.0000 Epoch 959/1000 9/9 [==============================] - 6s 667ms/step - loss: 0.2450 - accuracy: 1.0000 Epoch 960/1000 9/9 [==============================] - 6s 637ms/step - loss: 0.2425 - accuracy: 1.0000 Epoch 961/1000 9/9 [==============================] - 6s 668ms/step - loss: 0.2457 - accuracy: 1.0000 Epoch 962/1000 9/9 [==============================] - 6s 694ms/step - loss: 0.2415 - accuracy: 1.0000 Epoch 963/1000 9/9 [==============================] - 6s 623ms/step - loss: 0.2388 - accuracy: 1.0000 Epoch 964/1000 9/9 [==============================] - 6s 653ms/step - loss: 0.2345 - accuracy: 1.0000 Epoch 965/1000 9/9 [==============================] - 6s 654ms/step - loss: 0.2401 - accuracy: 1.0000 Epoch 966/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.2337 - accuracy: 1.0000 Epoch 967/1000 9/9 [==============================] - 6s 647ms/step - loss: 0.2422 - accuracy: 1.0000 Epoch 968/1000 9/9 [==============================] - 6s 618ms/step - loss: 0.2319 - accuracy: 1.0000 Epoch 969/1000 9/9 [==============================] - 6s 644ms/step - loss: 0.2310 - accuracy: 1.0000 Epoch 970/1000 9/9 [==============================] - 6s 649ms/step - loss: 0.2312 - accuracy: 1.0000 Epoch 971/1000 9/9 [==============================] - 6s 651ms/step - loss: 0.2357 - accuracy: 1.0000 Epoch 972/1000 9/9 [==============================] - 6s 645ms/step - loss: 0.2326 - accuracy: 1.0000 Epoch 973/1000 9/9 [==============================] - 6s 631ms/step - loss: 0.2368 - accuracy: 1.0000 Epoch 974/1000 9/9 [==============================] - 6s 628ms/step - loss: 0.2301 - accuracy: 1.0000 Epoch 975/1000 9/9 [==============================] - 6s 708ms/step - loss: 0.2311 - accuracy: 1.0000 Epoch 976/1000 9/9 [==============================] - 7s 772ms/step - loss: 0.2282 - accuracy: 1.0000 Epoch 977/1000 9/9 
[==============================] - 8s 899ms/step - loss: 0.2352 - accuracy: 1.0000 Epoch 978/1000 9/9 [==============================] - 7s 683ms/step - loss: 0.2288 - accuracy: 1.0000 Epoch 979/1000 9/9 [==============================] - 6s 704ms/step - loss: 0.2298 - accuracy: 1.0000 Epoch 980/1000 9/9 [==============================] - 7s 724ms/step - loss: 0.2262 - accuracy: 1.0000 Epoch 981/1000 9/9 [==============================] - 6s 666ms/step - loss: 0.2336 - accuracy: 1.0000 Epoch 982/1000 9/9 [==============================] - 6s 706ms/step - loss: 0.2218 - accuracy: 1.0000 Epoch 983/1000 9/9 [==============================] - 6s 684ms/step - loss: 0.2230 - accuracy: 1.0000 Epoch 984/1000 9/9 [==============================] - 6s 687ms/step - loss: 0.2300 - accuracy: 1.0000 Epoch 985/1000 9/9 [==============================] - 6s 674ms/step - loss: 0.2251 - accuracy: 1.0000 Epoch 986/1000 9/9 [==============================] - 7s 735ms/step - loss: 0.2266 - accuracy: 1.0000 Epoch 987/1000 9/9 [==============================] - 6s 705ms/step - loss: 0.2223 - accuracy: 1.0000 Epoch 988/1000 9/9 [==============================] - 9s 1s/step - loss: 0.2230 - accuracy: 1.0000 Epoch 989/1000 9/9 [==============================] - 7s 748ms/step - loss: 0.2279 - accuracy: 1.0000 Epoch 990/1000 9/9 [==============================] - 6s 674ms/step - loss: 0.2218 - accuracy: 1.0000 Epoch 991/1000 9/9 [==============================] - 6s 668ms/step - loss: 0.2220 - accuracy: 1.0000 Epoch 992/1000 9/9 [==============================] - 6s 686ms/step - loss: 0.2218 - accuracy: 1.0000 Epoch 993/1000 9/9 [==============================] - 6s 700ms/step - loss: 0.2217 - accuracy: 1.0000 Epoch 994/1000 9/9 [==============================] - 6s 652ms/step - loss: 0.2230 - accuracy: 1.0000 Epoch 995/1000 9/9 [==============================] - 6s 716ms/step - loss: 0.2282 - accuracy: 1.0000 Epoch 996/1000 9/9 [==============================] - 6s 650ms/step - loss: 0.2232 
- accuracy: 1.0000 Epoch 997/1000 9/9 [==============================] - 7s 722ms/step - loss: 0.2230 - accuracy: 1.0000 Epoch 998/1000 9/9 [==============================] - 6s 708ms/step - loss: 0.2236 - accuracy: 1.0000 Epoch 999/1000 9/9 [==============================] - 7s 742ms/step - loss: 0.2290 - accuracy: 1.0000 Epoch 1000/1000 9/9 [==============================] - 7s 737ms/step - loss: 0.2296 - accuracy: 1.0000
import matplotlib.pyplot as plt
acc = history.history['accuracy']
loss = history.history['loss']
epochs = range(len(acc))
plt.plot(epochs, acc, 'b', label='Training accuracy')
plt.title('Training accuracy')
plt.legend()
plt.figure()
plt.plot(epochs, loss, 'b', label='Training loss')
plt.title('Training loss')
plt.legend()
plt.show()
seed_text = "She provided him with"
next_words = 10
for _ in range(next_words):
    token_list = tokenizer.texts_to_sequences([seed_text])[0]
    token_list = pad_sequences([token_list], maxlen=max_sequence_len-1, padding='pre')
    # model.predict_classes was removed in recent TensorFlow versions;
    # take the argmax of the predicted probability distribution instead
    predicted = np.argmax(model.predict(token_list, verbose=0), axis=-1)[0]
    output_word = ""
    for word, index in tokenizer.word_index.items():
        if index == predicted:
            output_word = word
            break
    seed_text += " " + output_word
print(seed_text)
She provided him with not because he cowardly and abject quite the contrary but
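The loop above performs greedy decoding: at each step, the single highest-probability word is appended to the growing text. A minimal self-contained sketch of that idea, with a hypothetical `mock_predict` function and a tiny toy vocabulary standing in for the trained LSTM and its tokenizer:

```python
import numpy as np

# Toy vocabulary standing in for tokenizer.word_index (assumption for illustration)
word_index = {"she": 1, "provided": 2, "him": 3, "with": 4, "help": 5}
index_word = {i: w for w, i in word_index.items()}

def mock_predict(token_list):
    # Hypothetical stand-in for the trained model: always puts all
    # probability mass on the word "help"
    probs = np.zeros(len(word_index) + 1)
    probs[word_index["help"]] = 1.0
    return probs

seed_text = "She provided him with"
for _ in range(2):
    token_list = [word_index[w] for w in seed_text.lower().split() if w in word_index]
    predicted = int(np.argmax(mock_predict(token_list)))  # greedy argmax step
    seed_text += " " + index_word.get(predicted, "")
print(seed_text)  # She provided him with help help
```

Because argmax decoding is deterministic, a model stuck in a high-probability loop will repeat itself, which is one reason sampling-based decoding is often preferred for text generation.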
It seems that adding attention did not substantially improve the LSTM's performance here.